3 research outputs found

    Visual Analytics of Gaze Data with Standard Multimedia Player

    With the increasing number of studies in which participants’ eye movements are tracked while watching videos, the volume of gaze data records is growing tremendously. Unfortunately, in most cases, such data are collected in separate files in custom-made or proprietary data formats. These data are difficult to access even for experts and effectively inaccessible for non-experts, and expensive or custom-made software is normally necessary for their analysis. We address this problem by using existing multimedia container formats for distributing and archiving eye-tracking and gaze data bundled with the stimuli data. We define an exchange format that can be interpreted by standard multimedia players and streamed via the Internet. We convert several gaze data sets into our format, demonstrating the feasibility of our approach and allowing these data to be visualized with standard multimedia players. We also introduce two VLC player add-ons that allow for further visual analytics. We discuss the benefit of storing gaze data in a multimedia container and explain possible visual analytics approaches based on our implementations, converted datasets, and first user interviews.
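    The abstract does not specify the exchange format's internals, but one way to make gaze data readable by a standard player such as VLC is to encode samples as a timed-text (WebVTT) track bundled alongside the video. The following is a minimal, hypothetical sketch of that idea; the cue layout and the `gaze x=… y=…` payload are illustrative assumptions, not the authors' actual format.

    ```python
    # Hypothetical sketch: serializing gaze samples as a WebVTT subtitle
    # track that any standard multimedia player can display alongside the
    # stimulus video. The cue text format here is an assumption.

    def _ts(ms):
        """Format a millisecond offset as a WebVTT timestamp (HH:MM:SS.mmm)."""
        h, rem = divmod(ms, 3_600_000)
        m, rem = divmod(rem, 60_000)
        s, ms_part = divmod(rem, 1_000)
        return f"{h:02d}:{m:02d}:{s:02d}.{ms_part:03d}"

    def gaze_to_webvtt(samples, cue_ms=100):
        """samples: iterable of (timestamp_ms, x_px, y_px) gaze points.
        Each sample becomes a short cue lasting cue_ms milliseconds."""
        lines = ["WEBVTT", ""]
        for t, x, y in samples:
            lines.append(f"{_ts(t)} --> {_ts(t + cue_ms)}")
            lines.append(f"gaze x={x:.0f} y={y:.0f}")
            lines.append("")
        return "\n".join(lines)

    print(gaze_to_webvtt([(0, 512, 384), (100, 530, 390)]))
    ```

    A track produced this way could be muxed into a container such as Matroska together with the stimulus video, which matches the paper's goal of distributing stimuli and gaze data in one self-contained, streamable file.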

    Improving eye-tracking data quality: A framework for reproducible evaluations of detection algorithms

    No full text
    The individual error scores for different pupil detection algorithms, as reported in the manuscript "Improving eye-tracking data quality: A framework for reproducible evaluations of detection algorithms".

    Authentic Fear Responses in Virtual Reality: A Mobile EEG Study on Affective, Behavioral and Electrophysiological Correlates of Fear

    Fear is an evolutionary adaptation to a hazardous environment, linked to numerous complex behavioral responses, e.g., the fight-or-flight response, each suited to its respective environment. However, for the sake of experimental control, fear is mainly investigated under rather artificial laboratory conditions, which transform these evolutionary adaptations into artificial responses, like keystrokes. The immersive, multidimensional character of virtual reality (VR) enables realistic behavioral responses, overcoming the aforementioned limitations. To investigate authentic fear responses from a holistic perspective, participants explored either a negative or a neutral VR cave. To promote real-life behavior, we built a physical replica of the cave, providing haptic sensations. Electrophysiological correlates of fear-related approach and avoidance tendencies, i.e., frontal alpha asymmetries (FAA), were evaluated. To our knowledge, this is the first study to simultaneously capture complex behavior and associated electrophysiological correlates under highly immersive conditions. Participants in the negative condition exhibited a broad spectrum of realistic fear behavior and reported intense negative affect, as opposed to participants in the neutral condition. Despite these affective and behavioral differences, the groups could not be distinguished based on the FAAs for the greater part of the cave exploration. Taking the specific behavioral responses into account, the obtained FAAs could not be reconciled with well-known FAA models. Consequently, putting laboratory-based models to the test under realistic conditions shows that they may not unrestrictedly predict realistic behavior. As the VR environment facilitated non-mediated and realistic emotional and behavioral responses, our results demonstrate VR’s high potential to increase the ecological validity of scientific findings (video abstract: https://www.youtube.com/watch?v=qROsPOp87l4&feature=youtu.be).
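    The abstract does not spell out how the FAA index was computed, but a commonly used definition in the EEG literature is the difference of log-transformed alpha power between homologous right and left frontal electrodes (e.g., F4 and F3). The sketch below illustrates that conventional index; the electrode pair and the interpretation comment are assumptions based on standard practice, not details taken from this study.

    ```python
    import math

    def frontal_alpha_asymmetry(alpha_power_right, alpha_power_left):
        """Conventional FAA index: ln(right alpha) - ln(left alpha),
        e.g., computed over F4 (right) and F3 (left) band power.
        Because alpha power is inversely related to cortical activity,
        a positive index is typically read as relatively greater
        left-frontal activity (approach tendency)."""
        return math.log(alpha_power_right) - math.log(alpha_power_left)

    # Illustrative values in arbitrary power units (made up for the example)
    faa = frontal_alpha_asymmetry(2.0, 1.0)
    print(round(faa, 4))  # ln(2) - ln(1) = 0.6931
    ```

    Group comparisons like the one described in the abstract would then contrast such per-participant indices between the negative and neutral conditions over the course of the cave exploration.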